Mark Zuckerberg spent almost the entirety of his opening remarks during Meta's earnings call focused on the many ways the company loses money. As a result, the company's shares dropped by as much as 19% in extended trading on Wednesday, despite Meta reporting better-than-expected profit and revenue for the first quarter. This article discusses the earnings call in detail. Topics covered include Meta's plans for turning its AI investment into ad dollars, Llama 3, potential opportunities for expansion in the mixed reality headset market, and Meta's AR glasses.
Thursday, April 25, 2024

Mark Zuckerberg outlined why open source is key to Meta's strategy and how Meta plans to support this work.
Mark Zuckerberg believes open source AI is the best way to develop AI technology for several reasons: it allows developers to customize models to their needs, avoids vendor lock-in, protects sensitive data, and is more cost-effective. Zuckerberg also believes that open source AI is safer and more secure because it allows for wider scrutiny and transparency.
Multiple developers and reverse engineers have uncovered references to ad products in the Threads app's code, including the word “ads” itself, as well as references to sponsored items and ads configuration. However, Instagram says it's not testing ads in the app and has no immediate plans to monetize. The head of Instagram stated in April that Meta definitely plans to bring ads to Threads eventually, but Zuckerberg himself has said the team must first focus on retention and improving the basics.
Meta has unveiled a new prototype for augmented reality (AR) glasses, named Orion, which signifies a shift from the company's previous focus on bulky virtual reality (VR) headsets. During the Meta Connect keynote, CEO Mark Zuckerberg showcased these lightweight glasses, weighing only 100 grams, as a glimpse into the future of AR technology. The Orion prototype aims to provide a more comfortable and practical alternative to existing VR devices, which tend to be heavier and less user-friendly.

The design of the Orion glasses emphasizes the need for them to be lightweight and resemble traditional eyewear, avoiding the bulkiness associated with VR headsets like the Meta Quest 3. To achieve this, some processing is offloaded to a small wireless "puck" that connects to the glasses, allowing for a more streamlined design.

The glasses utilize innovative microprojection technology, in which tiny projectors embedded in the arms of the glasses project images into specially designed waveguides. This technology enables the display of holographic images layered over the real world, providing a true augmented reality experience rather than just a passthrough view. Zuckerberg highlighted the challenges of ensuring that the projected images are sharp and bright enough to be visible in various lighting conditions.

The Orion glasses boast a field of view of 70 degrees, larger than that of competitors like Microsoft's HoloLens 2 and the Magic Leap One. Users can interact with the holograms through voice commands, hand gestures, and eye tracking, but a notable feature is the "neural interface" wristband. This wristband can detect subtle wrist and finger movements, allowing users to control the AR experience without needing to speak or make large gestures.
Overall, the Orion prototype represents Meta's ambition to redefine the AR landscape, moving towards a future where augmented reality is seamlessly integrated into everyday life through lightweight and user-friendly devices.
Mark Zuckerberg envisions a future where augmented reality (AR) glasses, specifically Meta's Orion, will replace smartphones as the primary computing device. During an interview at Meta Connect, he discussed the long development journey of Orion, which has been in the works for nearly a decade. Initially intended as a consumer product, the glasses have evolved into a sophisticated demo due to production costs and technical challenges. Zuckerberg expressed confidence that AR glasses represent the next major platform shift, akin to the transition from desktop to mobile.

The partnership with EssilorLuxottica, the eyewear conglomerate behind Ray-Ban, is pivotal for Meta's strategy. Zuckerberg believes that this collaboration could replicate the success Samsung had in the smartphone market, positioning Meta to tap into a potentially massive market for smart glasses. The current iteration of Ray-Ban smart glasses has seen early success, indicating a consumer appetite for stylish, functional eyewear that integrates technology without overwhelming users.

Zuckerberg's demeanor during the interview reflected a newfound confidence and a willingness to engage in self-reflection regarding Meta's past controversies, including its role in political discourse and social media's impact on mental health. He acknowledged the challenges of navigating public perception and emphasized a desire for Meta to adopt a nonpartisan stance moving forward.

The conversation also touched on the integration of AI into the glasses, enhancing their functionality and user experience. Zuckerberg believes that as AI capabilities grow, users will increasingly rely on glasses for tasks traditionally performed on smartphones, leading to a gradual shift in how people interact with technology.
Zuckerberg's insights suggest that while smartphones will not disappear immediately, AR glasses will become more integral to daily life, allowing users to engage with digital content in a more immersive and seamless manner. He anticipates that as technology advances, the glasses will evolve to meet consumer needs, ultimately reshaping the landscape of personal computing.
At the recent Meta Connect 2024 event, CEO Mark Zuckerberg introduced a new feature that allows creators to utilize AI-powered video avatars, enabling them to create realistic digital representations of themselves. This innovation builds on the AI avatars that were first announced at the previous year's Connect event and became available to creators with the launch of Meta's AI Studio platform in July 2024. Initially, these avatars could only interact with fans through text, but the latest update allows for voice conversations across popular platforms such as Messenger, Instagram, and WhatsApp.

During a live demonstration, artist and author Don Allen Stevenson III showcased his video avatar, which was able to respond to questions about his creative process. While the avatar's speech patterns and lip synchronization still require refinement, it presented a convincing likeness and demonstrated the potential for creators to enhance their audience engagement. The AI was capable of handling questions smoothly and recognized when inquiries fell outside its expertise. Meta plans to begin testing these video avatars with users in the upcoming year.

Although Meta's offering is not entirely groundbreaking, as similar technologies are already available from competitors like HeyGen and Synthesia, its focus on creators and seamless integration across Meta's platforms sets it apart. The reception from creators and fans remains to be seen, especially in light of the mixed reactions to Meta's earlier celebrity AI avatars, which have since been discontinued.

Chris McKay, the founder and chief editor of Maginative, emphasizes the importance of AI literacy and strategic adoption in the evolving landscape of artificial intelligence.
Meta is currently testing the integration of custom AI-generated content into Instagram and Facebook feeds, a move that has raised significant concerns among users and commentators. The initiative aims to introduce more recommended content from accounts that users do not follow, with the added twist that this content will be entirely generated by Meta's AI. This development has sparked a strong reaction, particularly from those who are wary of the implications of AI in social media.

The AI-generated images are designed to be tailored to individual interests and may even feature users themselves, creating scenarios where an AI version of a person's face could appear in their feed. This concept has been met with skepticism, as many question the necessity and appropriateness of such content in personal social media spaces. The announcement was made during Meta's annual AR/VR event, where the company also unveiled various enhancements to its AI assistant, Meta AI, including new voice features and improved video dubbing capabilities.

As AI technology continues to advance, the distinction between real and AI-generated content is becoming increasingly blurred. While some users have found value in AI tools, the prospect of having feeds cluttered with AI-generated images, especially those that may not resonate with users, raises concerns about the overall user experience. Earlier attempts by Meta to label AI-generated content were met with mixed results, as the labeling system was not foolproof and sometimes misidentified edited images as AI-generated. Privacy issues also loom large, as users currently cannot opt out of having their publicly shared posts used to train AI models. Mark Zuckerberg has suggested that incorporating AI content into social feeds is a natural progression for social media platforms, but many users are skeptical about the value of such content.
The prevailing sentiment is that AI-generated posts could further degrade a user experience already compromised by an influx of ads and irrelevant recommendations. Overall, the introduction of AI-generated content into social media feeds is seen as a potential misstep that could further alienate users, detracting from the original purpose of these platforms as spaces for genuine connection and interaction. The ongoing evolution of social media, particularly with the integration of AI, continues to provoke debate about the future of online engagement and the balance between innovation and user satisfaction.